Parameters: $l_r = 7\times10^{-5}$, $\beta_1 = 0.05$, $\beta_2 = 0.9$, LAMBDA = 16, ncritic = 3, and batch_size = 32.
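To make this setup concrete, below is a minimal sketch of the configuration, assuming PyTorch and assuming LAMBDA is a WGAN-GP gradient-penalty coefficient. The small MLP generator and critic, and the sizes `Z_DIM` and `X_DIM`, are placeholders rather than the actual architecture used here.

```python
# Minimal configuration sketch (assumptions: PyTorch; LAMBDA = gradient-penalty
# weight; placeholder MLP generator/critic and dimensions).
import torch
import torch.nn as nn

LR, BETAS = 7e-5, (0.05, 0.9)            # l_r, (beta_1, beta_2)
LAMBDA, N_CRITIC, BATCH_SIZE = 16, 3, 32
Z_DIM, X_DIM = 100, 784                  # assumed latent and data dimensions

G = nn.Sequential(nn.Linear(Z_DIM, 256), nn.ReLU(), nn.Linear(256, X_DIM))
D = nn.Sequential(nn.Linear(X_DIM, 256), nn.ReLU(), nn.Linear(256, 1))

opt_G = torch.optim.Adam(G.parameters(), lr=LR, betas=BETAS)
opt_D = torch.optim.Adam(D.parameters(), lr=LR, betas=BETAS)
```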
First, train for 60 epochs in a single run (a sketch of this training loop follows the figures):
Figure 5
Figure 6
Figure 7
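Below is a minimal sketch of the single 60-epoch run, reusing the setup from the previous snippet and again assuming the WGAN-GP objective with ncritic critic steps per generator step. The data loader wraps dummy random data standing in for the real training set, and `train` is a hypothetical helper name.

```python
from torch.utils.data import DataLoader

def gradient_penalty(critic, real, fake):
    # WGAN-GP term: push the critic's gradient norm on interpolated samples towards 1.
    eps = torch.rand(real.size(0), 1)
    x_hat = (eps * real + (1 - eps) * fake).requires_grad_(True)
    grad = torch.autograd.grad(critic(x_hat).sum(), x_hat, create_graph=True)[0]
    return ((grad.norm(2, dim=1) - 1) ** 2).mean()

# Dummy data standing in for the real training set.
loader = DataLoader(torch.randn(1024, X_DIM), batch_size=BATCH_SIZE, shuffle=True)

def train(num_epochs):
    for epoch in range(num_epochs):
        for i, real in enumerate(loader):
            # Critic step: minimize D(fake) - D(real) + LAMBDA * penalty.
            fake = G(torch.randn(real.size(0), Z_DIM)).detach()
            d_loss = (D(fake).mean() - D(real).mean()
                      + LAMBDA * gradient_penalty(D, real, fake))
            opt_D.zero_grad(); d_loss.backward(); opt_D.step()

            # One generator step per N_CRITIC critic steps: minimize -D(G(z)).
            if i % N_CRITIC == 0:
                g_loss = -D(G(torch.randn(real.size(0), Z_DIM))).mean()
                opt_G.zero_grad(); g_loss.backward(); opt_G.step()

train(60)  # the single 60-epoch run
```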
Another way to train the WGAN: after 27 epochs, reset the Adam optimizer and then train for another 27 epochs. With this schedule, the generator loss is more likely to converge.
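Under the same assumptions, the reset amounts to re-creating the two Adam optimizers between two 27-epoch calls to the hypothetical `train` helper sketched above; since `train` reads `opt_G`/`opt_D` as module-level names, re-binding them is enough for the second run to start from fresh optimizer state.

```python
train(27)  # first 27-epoch run

# Re-creating the optimizers discards Adam's running first/second-moment
# estimates and step counters, i.e. "resets the Adam optimizer".
opt_G = torch.optim.Adam(G.parameters(), lr=LR, betas=BETAS)
opt_D = torch.optim.Adam(D.parameters(), lr=LR, betas=BETAS)

train(27)  # second 27-epoch run, with fresh optimizer state
```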
Results are as follows:
Figure 8 (left: the second 27-epoch training run.)
Figure 9